Lawyer caught using AI-generated false citations in court case penalised in Australian first

The Guardian

A Victorian lawyer has become the first in Australia to face professional sanctions for using artificial intelligence in a court case, being stripped of his ability to practise as a principal lawyer after AI generated false citations that he had failed to verify. Guardian Australia reported in October last year that in a 19 July 2024 hearing, the anonymous solicitor, representing a husband in a dispute between a married couple, provided the court with a list of prior cases that had been requested by Justice Amanda Humphreys in relation to an enforcement application in the case. When Humphreys returned to her chambers, she said in a ruling, neither she nor her associates were able to identify the cases in the list. When the matter returned to court, the lawyer confirmed that the list had been prepared using legal software that utilised AI. He acknowledged he did not verify the accuracy of the information before submitting it to the court.


US lawyer sanctioned after being caught using ChatGPT for court brief

The Guardian

The Utah court of appeals has sanctioned a lawyer after he was discovered to have used ChatGPT for a filing in which he referenced a nonexistent court case. Earlier this week, the Utah court of appeals made the decision to sanction Richard Bednar over claims that he filed a brief which included false citations. According to court documents reviewed by ABC4, Bednar and Douglas Durbano, another Utah-based lawyer who was serving as the petitioner's counsel, filed a "timely petition for interlocutory appeal". Upon reviewing the brief, which was written by a law clerk, the respondent's counsel found several false citations of cases. "It appears that at least some portions of the Petition may be AI-generated, including citations and even quotations to at least one case that does not appear to exist in any legal database (and could only be found in ChatGPT) and references to cases that are wholly unrelated to the referenced subject matter," the respondent's counsel said in documents reviewed by ABC4.